# Week 1 Wednesday
## Constant solutions (equilibrium solutions).
- Sometimes, a differential equation for $y=y(x)$ has a **constant** solution. Namely, a solution $y(x)=k$ for some fixed constant $k$ for all $x$. If so, then $y'=0$. A constant solution is also called an **equilibrium solution**.
- Find all constant solutions to $y'=(1+y)(1+x)$. Note if $y=k$ is a constant solution, then we need $0 = (1+k)(1+x)$ for all $x$, which holds exactly when $k=-1$. So we have a constant solution $y=-1$. If $y=k$ for any constant $k \neq -1$, then $(1+k)(1+x)$ is not identically $0$, so there are no other constant solutions. (A symbolic check of this example is sketched after this list.)
- Find all constant solutions to $\frac{dT}{dt}=k(T-T_{env})$. Here if $T=c$ is constant, then $0=k(c-T_{env})$, so (assuming $k\neq 0$) $T(t)=T_{env}$ is the only constant solution.
- Find all constant solutions to $y'=(x+y)(y+4)(x+3)\cos(y)$. If $y=c$ is constant, we need $(x+c)(c+4)(x+3)\cos(c)=0$ for all $x$, which forces $(c+4)\cos(c)=0$. So $y=-4$ and $y=\frac{\pi}{2}+k\pi$ (for any integer $k$) are all the constant solutions.
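As promised above, here is a symbolic check of the first example: a minimal SymPy sketch (mine, not from the lecture) that substitutes $y=k$ and demands the right-hand side vanish for every $x$.

```python
# Symbolic check (a sketch, not from the lecture): find constant solutions of
# y' = (1 + y)(1 + x) by substituting y = k and requiring the right-hand
# side to vanish for every x.
import sympy as sp

x, k = sp.symbols("x k", real=True)
rhs = (1 + k) * (1 + x)  # the right-hand side with y replaced by the constant k

# Viewed as a polynomial in x, the RHS is identically zero exactly when
# every coefficient is zero.
coeffs = sp.Poly(rhs, x).coeffs()  # [k + 1, k + 1]
print(sp.solve(coeffs, k))  # k = -1, so y = -1 is the only constant solution
```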
## Initial value problems (IVPs)
- We will pay attention to first-order differential equations of the form $y'=f(x,y)$, with a specified initial condition, $y(x_{0}) = y_{0}$. This is called an **initial value problem (IVP)** for a first-order DE.
- For example $y' = x + y$, $y(3) = 2$ is an initial value problem.
- We care about IVPs because the DE models some system and the initial condition is our starting point. This is a typical way to pose a problem.
## Do solutions to an IVP always exist? And are they unique?
- M... is also for monsters. "Monster" comes from Latin _monstrum_, meaning "divine omen" or "warning", as in "demonstrate".
- Given a first-order initial value problem, is it always possible to find some solution to it? And if a solution exists, is it unique?
- The answer is no to both. There are many weird functions out there...!
- Here is an example of **an IVP with no solutions whatsoever** (with any initial condition you like): $$
y'= \begin{cases}
1 & \text{if } x \text{ is rational} \\ 0 & \text{if } x \text{ is irrational}
\end{cases}
$$The function on the right, $f(x) = \begin{cases}1 & \text{if } x \text{ is rational} \\ 0 & \text{if } x \text{ is irrational}\end{cases}$, is called **Dirichlet's function**. This is a _monster_ of a function, quite often used as an example or counterexample in math. As it turns out, no function $y=y(x)$ can have Dirichlet's function as its derivative.
- The reason is **Darboux's theorem**: the derivative of a differentiable function must satisfy the intermediate value property. Dirichlet's function takes the values $0$ and $1$ but nothing in between, so it fails the intermediate value property and cannot be anyone's derivative.
- Also note that it is possible for a differentiable function to have a discontinuous derivative (which still satisfies Darboux). For example, the function $f(x) = \begin{cases} x^{2}\sin(\frac{1}{x}) & \text{if } x\neq 0\\ 0 & \text{if } x =0\end{cases}$ is differentiable everywhere, but its derivative is not continuous at $x=0$.
- Bottom line: a differential equation $y'(x) = f(x,y)$ is not guaranteed to have a solution.
- Also, even **if a differential equation has a solution, it need not be unique, even if we specify an initial condition.** (So far, it has seemed like once we get a 1-parameter family of solutions, applying an initial condition determines the parameter uniquely. But this is not always the case.)
- For example, consider the differential equation for $y=y(x)$ given by $y' = \sqrt{y}$, with $y(0) = 0$. Are there solutions whose interval of definition is $(-\infty, \infty)$?
- First of all, we can look for **constant solutions**, that is, solutions where $y(x)=c$ for some constant $c$. Well, if $y=c$, then $y'=0$, and we see that $y(x) = 0$ works. This also satisfies the initial condition $y(0) = 0$. So we have a constant solution $y(x) = 0$, with interval of definition $(-\infty,\infty)$.
- But this differential equation is also separable. Let us try our method: $dy / dx = \sqrt{y}$, so $dy / \sqrt{y} = dx$ (note that dividing by $\sqrt{y}$ quietly assumes $y \neq 0$), which gives $2\sqrt{y}=x+C$ for some parameter $C$. We can explicitly solve for $y$ and get $y(x)=\frac{1}{4} (x+C)^{2}$. Using the initial condition $y(0) = 0$, we get $C=0$, so $y=\frac{1}{4}x^{2}$. But wait, something is not quite right: here $y'=\frac{1}{2} x$ while $\sqrt{y} = \frac{1}{2} |x|$, so our solution only works for $x \ge 0$. We can patch the part where $x < 0$ with the constant solution we found and get a solution for all $x$, namely $$
y(x) = \begin{cases}
\frac{1}{4}x^{2} & \text{if } x \ge 0 \\ 0 & \text{if } x < 0
\end{cases}.
$$ This is a different solution from our constant solution, satisfying the same differential equation with the same initial condition, and it has interval of definition $(-\infty,\infty)$.
- But wait, there's more. In fact, we can _translate_ this function to the right, and we still get a solution on all of $\mathbb R$: For any $T > 0$, $$
y(x) = \begin{cases}
\frac{1}{4} (x-T)^{2} & \text{if } x \ge T \\ 0 & \text{if } x < T
\end{cases}.
$$ Drawing pictures of these will convince you. Here $y'(x) = \begin{cases}\frac{1}{2} (x-T) & \text{if } x \ge T \\ 0 & \text{if } x < T \end{cases}$ and $\sqrt{y(x)} =\begin{cases}\frac{1}{2} |x-T| & \text{if } x \ge T \\ 0 & \text{if } x < T \end{cases}$, so indeed $y'=\sqrt{y}$ on all of $\mathbb R$.
- Bottom line: this DE $y'=\sqrt{y}$ with initial condition $y(0)=0$ has infinitely many different solutions! (A numerical sanity check of this family is sketched after this list.)
- The above example is in fact quite troublesome for physics. In physics one often models a system by a differential equation, and after specifying an initial condition, the system should evolve deterministically. The example above shows that if our system were governed by this DE and initial condition, we would not know how it evolves! A more concrete physics thought experiment is given by Norton's Dome. One way to resolve this is to declare the above DE non-physical, but Norton's Dome is, hypothetically, a physical example. In my opinion, physics is incomplete and hard to capture exactly. Do we reject the notion of a "clockwork universe" of determinism? Who knows.
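Here is the numerical sanity check promised above: a short NumPy sketch (my own; it only spot-checks on a grid, it proves nothing) that each translate $y_{T}$ satisfies $y'=\sqrt{y}$ and $y(0)=0$.

```python
# Numerical sanity check (a sketch, not from the lecture): for several shifts
# T >= 0, verify on a grid that y_T(x) = (x - T)^2 / 4 for x >= T and
# y_T(x) = 0 otherwise satisfies y' = sqrt(y), with y_T(0) = 0.
import numpy as np

def y(x, T):
    return np.where(x >= T, (x - T) ** 2 / 4.0, 0.0)

def y_prime(x, T):
    return np.where(x >= T, (x - T) / 2.0, 0.0)

xs = np.linspace(-5.0, 10.0, 1001)
for T in [0.0, 1.0, 3.0]:
    residual = y_prime(xs, T) - np.sqrt(y(xs, T))
    print(T, np.max(np.abs(residual)))  # ~0 for every T: each y_T solves the DE
    assert y(0.0, T) == 0.0             # and each satisfies y(0) = 0
```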
## Existence and uniqueness theorems.
- Let us now talk about nice situations.
- In the $y' =$ Dirichlet's function example, the prescribed derivative was terrible, causing the IVP to have no solution whatsoever. As it turns out, if $y'=f(x,y)$ and $f(x,y)$ is a nice function on an open set, then we can guarantee the existence of some local solution. What is the precise condition?
- **Peano's existence theorem.** Suppose $f(x,y)$ is continuous on an open rectangle $D = (a,b)\times(c,d)$ in $\mathbb R^{2}$. Then for any point $(x_{0},y_{0}) \in D$, there exists a local solution $y = y(x)$ to the IVP $y'=f(x,y)$ and $y(x_{0}) = y_{0}$, with interval of definition $(x_{0}-\epsilon,x_{0}+\epsilon)$ for some small $\epsilon > 0$. This solution need not be unique.
- A local solution is a solution whose interval of definition is some interval $(x_{0}-\epsilon, x_{0}+\epsilon)$ around the specified point.
- Example. For $y'= \frac{xy}{x^{2}+y^{2}+1}$ and $y(3) = 2$. Using Peano's theorem, what can you say?
- Answer. Well, the function $f(x,y) = \frac{xy}{x^{2}+y^{2}+1}$ is continuous on the open rectangle $D=(2,4)\times(1,3)$, which contains the point $(3,2)$. (Recall when $f(x,y)$ is a rational function, it is continuous wherever it is defined.) Then by Peano's existence theorem, there exists some local solution to this IVP.
- Example. For $y' = \sqrt{x-y}$, $y(0)=0$. Using Peano's theorem, what can you say?
- Answer. Well, the function $f(x,y)=\sqrt{x-y}$ is undefined when $y > x$. Drawing this region in the plane, we see that we cannot find an open rectangle containing $(0,0)$ on which $f$ is continuous. So Peano's theorem _does not apply_, and we are left inconclusive. Unclear if there is a solution (**yet**!).
- Example. For $y' = \frac{x^{2}+e^y}{2+\sin(xy)}$ and $y(12)=100$. Is there a solution to this IVP?
- Answer. The denominator $2+\sin(xy)$ is always between $1$ and $3$, so this function $f(x,y) = \frac{x^{2}+e^{y}}{2+\sin(xy)}$ is continuous on all of $\mathbb{R}^{2}$, in particular on an open rectangle $D=(11,13)\times(99,101)$ containing $(12,100)$. So by Peano's theorem, there exists a local solution to this IVP.
- Bottom line. If we have an IVP $y'=f(x,y)$ and $y(x_{0})=y_{0}$, where $f$ is continuous on an open rectangle $D$ containing $(x_{0},y_{0})$, then Peano's theorem guarantees there exists some local solution. Nothing is said about uniqueness. If the hypothesis is not met, then the theorem is inconclusive (it is not an if-and-only-if statement).
- Is there a way to guarantee uniqueness of this local solution? Yes, if $f(x,y)$ is even nicer.
- **Picard's existence and uniqueness theorem.** Suppose $f(x,y)$ **and** $\frac{\partial f}{\partial y}$ are both continuous on an open rectangle $D = (a,b)\times(c,d)$. Then for any point $(x_{0},y_{0}) \in D$, there exists a unique local solution $y=y(x)$ to the initial value problem $y'=f(x,y)$ and $y(x_{0}) = y_{0}$, with interval of definition $(x_{0}-\epsilon,x_{0} + \epsilon)$ for some small enough $\epsilon > 0$.
- This is basically Peano's theorem with the additional check that $\partial f / \partial y$ is continuous on our open rectangle.
- Example. For $y'= \frac{xy}{x^{2}+y^{2}+1}$ and $y(3) = 2$. What can you say now?
- Answer. Note that for $f(x,y) = \frac{xy}{x^{2}+ y^{2} + 1}$, both $f$ and $\frac{\partial f}{\partial y} = \frac{(x^{2}+y^{2}+1)x - xy(2y)}{(x^{2}+y^{2}+1)^{2}}$ are continuous on $D = (2,4) \times(1,3)$, which contains $(3,2)$, so there exists a unique local solution by Picard's existence and uniqueness theorem. (A symbolic check of this example is sketched after this list.)
- Example. For $y' = \sqrt{y}$, $y(0)=0$, what can we say?
- Answer. Note for $f(x,y) = \sqrt{y}$, $f_{y} = \frac{1}{2 \sqrt{y}}$, which is not even defined at $(0,0)$. We cannot draw an open rectangle around $(0,0)$ on which $f_{y}$ is continuous, so Picard's theorem says nothing about uniqueness. And there is no open rectangle about $(0,0)$ on which $f(x,y)=\sqrt{y}$ is continuous either, so we cannot even conclude existence. Both theorems leave us inconclusive. (Of course, we already exhibited infinitely many solutions for this IVP by hand.)
- Bottom line. If we have an IVP $y'=f(x,y)$ and $y(x_{0})=y_{0}$, where $f$ and $f_{y}$ are both continuous on an open rectangle $D$ containing $(x_{0},y_{0})$, then Picard's theorem guarantees there exists a unique local solution. If the hypothesis is not met, then the theorem is inconclusive (it is not an if-and-only-if statement).
- An example of an IVP having a unique global solution while failing the hypothesis of Picard's theorem: for $y=y(x)$, $y'= \begin{cases} 10 & \text{if } y > 0 \\ 0 & \text{if } y = 0 \\ -10 & \text{if }y < 0\end{cases}$, and $y(0) = 0$. Note $y(x) =0$ is a global (constant) solution, and it is the unique global solution. [[1 teaching/smc-summer-2025-math15/unique-global-solution-but-fails-picards|unique-global-solution-but-fails-picards]]
- **Geometric consequence of uniqueness.** If a first-order IVP $y'=f(x,y)$ with $y(x_{0})=y_{0}$ has a unique local solution, then there is only one possible solution curve $y(x)$ locally near the point $(x_{0},y_{0})$, satisfying the initial condition.
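And the symbolic check promised in the example above: a small SymPy sketch (mine, not from the lecture) computing $\partial f/\partial y$ for $f(x,y)=\frac{xy}{x^{2}+y^{2}+1}$ and confirming the denominator never vanishes, so $f$ and $f_{y}$ are continuous on all of $\mathbb{R}^{2}$, in particular on $D=(2,4)\times(1,3)$.

```python
# Sketch: check the Picard hypotheses for f(x, y) = x*y / (x^2 + y^2 + 1).
import sympy as sp

x, y = sp.symbols("x y", real=True)
f = x * y / (x**2 + y**2 + 1)

fy = sp.simplify(sp.diff(f, y))
print(fy)  # equivalent to x*(x**2 - y**2 + 1)/(x**2 + y**2 + 1)**2

# Both f and fy are rational functions with the same denominator, so the
# only possible discontinuities are where x^2 + y^2 + 1 = 0. But:
print((x**2 + y**2 + 1).is_positive)  # True: the denominator never vanishes
```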
## Picard's iterates.
We now describe an iterative method for constructing a solution to an IVP. This method is in fact part of the proof of Picard's existence and uniqueness theorem.
Given $y'=f(x,y)$ and initial condition $y(x_{0}) = y_{0}$.
We will construct a **sequence** of functions, $y_{0}(x), y_{1}(x), y_{2}(x), ...$ and **hope** these functions converge to a solution to the given IVP.
First, observe that by the FTC, a solution must satisfy the integral equation $y(x)=y_{0} + \int_{x_{0}}^{x} f(t,y(t))dt$. What we will do is iteratively plug an estimate $y_{k}(x)$ of the solution into the right-hand side to get a new estimate $y_{k+1}(x)$.
Step 0. $y_{0}(x) = y_{0}$, the constant $y_{0}$.
Step 1. $y_{1} = y_{0} + \int_{x_{0}}^{x}f(t,y_{0}(t))dt$
Step 2. $y_{2} = y_{0} + \int_{x_{0}}^{x}f(t,y_{1}(t))dt$
...
Step $k+1$. $y_{k+1} = y_{0} + \int_{x_{0}}^{x} f(t,y_{k}(t)) dt$
These functions $y_{k}(x)$ are called **Picard iterates**.
And **if** the functions $y_{k}(x)$ converge, then we **hope** they converge to a solution of our IVP. As it turns out, the hypothesis of Picard's theorem guarantees that they converge to the unique local solution!
Also, this gives a possible way to approximate a solution iteratively, as in the following sketch.
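Here is a minimal SymPy sketch of the iteration (my own, not from the lecture; the helper name `picard` is illustrative, and it assumes each integral can be done in closed form).

```python
# A minimal sketch of Picard iteration in SymPy (assumes each integral
# can be computed in closed form).
import sympy as sp

x, t, y = sp.symbols("x t y", real=True)

def picard(f, x0, y0, n):
    """Return the Picard iterates [y_0(x), ..., y_n(x)] for the IVP
    y' = f(x, y), y(x0) = y0, where f is a SymPy expression in x and y."""
    yk = sp.sympify(y0)  # y_0(x) = y_0, the constant function
    iterates = [yk]
    for _ in range(n):
        # Plug the current estimate into the integral equation:
        # y_{k+1}(x) = y_0 + integral from x0 to x of f(t, y_k(t)) dt
        integrand = f.subs([(y, yk.subs(x, t)), (x, t)])
        yk = y0 + sp.integrate(integrand, (t, x0, x))
        iterates.append(sp.expand(yk))
    return iterates
```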
Example. $y'=1$, $y(3)=1$.
$y_{0}(x)=1$
$y_{1}(x) = 1 + \int_{3}^{x} 1 dt =1+x-3=x-2$.
$y_{2}(x)=1+\int_{3}^{x} 1dt = x-2$.
...
So we see $y_{k}(x) = x-2$ for all $k \ge 1$. These functions converge to $y(x)=x-2$, which is indeed the solution to our IVP!
Remark. If $f(x,y)$ does not depend on $y$ at all, then this is just an FTC problem, and the Picard iterates converge after one step! (See the quick check below.)
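Checking the remark with the `picard` sketch above on the previous example:

```python
# y' = 1, y(3) = 1: f does not depend on y, so the iterates stabilize
# after a single step.
print(picard(sp.Integer(1), 3, 1, 2))  # [1, x - 2, x - 2]
```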
Example. $y'=y$, $y(3) = 1$. Here $f(x,y)=y$.
$y_{0}(x) = 1$
$y_{1}(x) = 1 + \int_{3}^{x}f(t,y_{0}(t))dt =1+\int_{3}^{x}1dt=1+x-3=x-2$
$y_{2}(x)=1+\int_{3}^{x} f(t,t-2)dt= 1+\int_{3}^{x}(t-2) dt =1+\left[ \frac{t^{2}}{2}-2t \right]_{t=3}^{x} =1+\frac{x^{2}}{2}-2x -\frac{9}{2}+6$
$= \frac{5}{2}-2x+ \frac{x^{2}}{2}$
$y_{3}(x) = 1 + \int_{3}^{x} \left( \frac{5}{2}-2t + \frac{t^{2}}{2} \right) dt = -2 + \frac{5 x}{2} - x^2 + \frac{x^3}{6}$
...
If we plot $y_{3}(x)$ against the known solution of this IVP, $y(x)=\frac{1}{e^{3}}e^{x} = e^{x-3}$, we will see that they are remarkably close near $x=3$! (See the sketch below.)
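Running the `picard` sketch above on this example reproduces the hand computation, and comparing with the Taylor expansion of $e^{x-3}$ about $x=3$ shows that, for this IVP, the iterates are exactly the Taylor partial sums of the true solution:

```python
# Reproduce the iterates for y' = y, y(3) = 1, and compare with the
# Taylor expansion of the true solution e^(x-3) about x = 3.
print(picard(y, 3, 1, 3))
# [1, x - 2, x**2/2 - 2*x + 5/2, x**3/6 - x**2 + 5*x/2 - 2]
print(sp.series(sp.exp(x - 3), x, 3, 4))
# 1 + (x - 3) + (x - 3)**2/2 + (x - 3)**3/6 + O(...): expanding these
# partial sums recovers exactly the iterates above.
```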
To summarize:
(1) Given an IVP $y'=f(x,y)$ and $y(x_{0}) = y_{0}$, we can construct the Picard iterates $y_{0}(x) = y_{0}$ and $y_{n+1}(x) = y_{0} + \int_{x_{0}}^{x}f(t,y_{n}(t))dt$ for all $n\ge 0$. This comes from taking our current estimate of a solution and plugging it into the integral equation to get a new estimate.
(2) If the IVP actually satisfies the hypothesis of Picard's theorem, then the iterates converge to the unique local solution! Hence these iterates can be used to approximate the true solution.